FORBID: Fast Overlap Removal by Stochastic GradIent Descent for Graph Drawing

Authors

Abstract

While many graph drawing algorithms consider nodes as points, visualization tools often represent them as shapes. These shapes support the display of information such as labels or encode various data with size or color. However, they can create overlaps between nodes, which hinder the exploration process by hiding parts of the information. It is therefore of utmost importance to remove these overlaps to improve readability. When overlaps are not handled by the layout process itself, Overlap Removal (OR) algorithms have been proposed as layout post-processing. As layouts usually convey information about the graph topology, it is important that OR algorithms preserve it as much as possible. We propose a novel algorithm that models OR as a joint stress and scaling optimization problem, and leverages efficient stochastic gradient descent. This approach is compared to state-of-the-art algorithms, and several quality metrics demonstrate its efficiency to quickly remove overlaps while retaining the initial layout structures.
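
A rough, hypothetical illustration of the idea: the sketch below alternates a stochastic update that pushes a randomly sampled pair of overlapping rectangles apart with a stress-like pull of each node toward its initial position. All names, constants, and update rules here are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def overlap_removal_sgd(pos, sizes, n_iters=2000, lr=0.05, stress_w=0.1, rng=None):
    """Toy overlap removal in the spirit of the abstract: stochastic updates
    separate sampled overlapping rectangle pairs while a stress term pulls
    nodes back toward the input layout. Hypothetical sketch, not FORBID itself.

    pos:   (n, 2) array of node centers.
    sizes: (n, 2) array of node (width, height).
    """
    rng = np.random.default_rng() if rng is None else rng
    pos = np.asarray(pos, dtype=float).copy()
    sizes = np.asarray(sizes, dtype=float)
    init = pos.copy()
    n = len(pos)
    for _ in range(n_iters):
        i, j = rng.choice(n, size=2, replace=False)
        d = pos[j] - pos[i]
        min_sep = (sizes[i] + sizes[j]) / 2.0   # center distance needed per axis
        depth = min_sep - np.abs(d)             # per-axis penetration depth
        if (depth > 0).all():                   # overlap on both axes = collision
            axis = int(np.argmin(depth))        # separate along the cheaper axis
            sign = 1.0 if d[axis] >= 0 else -1.0
            pos[i, axis] -= sign * lr * depth[axis]
            pos[j, axis] += sign * lr * depth[axis]
        for k in (i, j):                        # stress: stay close to the input layout
            pos[k] -= lr * stress_w * (pos[k] - init[k])
    return pos
```

In the joint problem the abstract describes, a scaling term would additionally grow the drawing to make room when overlaps cannot be resolved by local moves alone; it is left out here for brevity.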

Similar articles

Fast Node Overlap Removal in Graph Layout

Most graph layout algorithms in the field of graph drawing treat nodes as points. The problem of node overlap removal is to adjust the layout generated by such methods so that nodes of non-zero width and height do not overlap, yet are as close as possible to their original positions. We give an O(n log n) algorithm for achieving this assuming that the number of nodes overlapping any single node ...
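
The snippet is cut off, but the bounded-overlap assumption it starts to state is easy to picture with a sweep over x-sorted rectangles. The sketch below covers only the detection of overlapping pairs, not the paper's removal algorithm; names are illustrative.

```python
def overlapping_pairs(rects):
    """Find pairs of axis-aligned rectangles that overlap, sweeping over
    x-sorted left edges. rects: list of (x, y, w, h) with (x, y) the
    lower-left corner. Cheap when each node overlaps few others, which is
    the regime the truncated sentence above begins to assume."""
    order = sorted(range(len(rects)), key=lambda i: rects[i][0])
    active, pairs = [], []
    for i in order:
        xi, yi, wi, hi = rects[i]
        # retire rectangles that end before the new one starts on the x-axis
        active = [j for j in active if rects[j][0] + rects[j][2] > xi]
        for j in active:
            xj, yj, wj, hj = rects[j]
            if yi < yj + hj and yj < yi + hi:  # y-intervals intersect too
                pairs.append((j, i))
        active.append(i)
    return pairs
```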

A Fast Distributed Stochastic Gradient Descent Algorithm for Matrix Factorization

The accuracy and effectiveness of the matrix factorization technique were well demonstrated in the Netflix movie recommendation contest. Among the numerous solutions for matrix factorization, Stochastic Gradient Descent (SGD) is one of the most widely used algorithms. However, as a sequential approach, the SGD algorithm cannot be used directly in a Distributed Cluster Environment (DCE). In this paper...
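
For context on the sequential baseline that such work distributes, here is a minimal SGD matrix-factorization loop; hyperparameters and names are illustrative, not taken from the paper.

```python
import numpy as np

def sgd_matrix_factorization(ratings, n_users, n_items, k=8,
                             lr=0.01, reg=0.05, epochs=20, seed=0):
    """Sequential SGD matrix factorization. ratings: list of
    (user, item, value) triples; returns factor matrices P, Q with
    prediction P[u] @ Q[i]."""
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((n_users, k))    # user factor matrix
    Q = 0.1 * rng.standard_normal((n_items, k))    # item factor matrix
    for _ in range(epochs):
        for idx in rng.permutation(len(ratings)):  # visit ratings in random order
            u, i, r = ratings[idx]
            p, q = P[u].copy(), Q[i].copy()
            err = r - p @ q                        # prediction error
            P[u] += lr * (err * q - reg * p)       # gradient step on user factors
            Q[i] += lr * (err * p - reg * q)       # gradient step on item factors
    return P, Q
```

Each update touches shared rows of P and Q, which is exactly why naive parallelization of this loop causes conflicts and motivates distributed variants.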

Variational Stochastic Gradient Descent

In the Bayesian approach to probabilistic modeling of data, we select a model for probabilities of data that depends on a continuous vector of parameters. For a given data set, Bayes' theorem gives a probability distribution of the model parameters. Then the inference of outcomes and probabilities of new data could be found by averaging over the parameter distribution of the model, which is an intr...
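
The snippet breaks off before the method itself, but the standard route from an intractable posterior average to SGD is to fit a simple variational family. The sketch below, a generic illustration rather than necessarily this paper's scheme, runs one-sample reparameterized gradient ascent on the ELBO of a diagonal Gaussian; finite differences stand in for automatic differentiation.

```python
import numpy as np

def variational_sgd(log_joint, d, n_iters=1000, lr=0.01, seed=0):
    """Fit q(theta) = N(mu, diag(exp(2 * rho))) to a posterior by stochastic
    gradient ascent on a one-sample ELBO estimate (reparameterization trick).
    Generic sketch; log_joint(theta) must return log p(data, theta)."""
    rng = np.random.default_rng(seed)
    mu, rho = np.zeros(d), np.zeros(d)          # variational parameters
    h = 1e-4                                    # finite-difference step
    eye = np.eye(d)
    for _ in range(n_iters):
        eps = rng.standard_normal(d)
        theta = mu + np.exp(rho) * eps          # reparameterized sample
        # finite-difference estimate of grad_theta log p(data, theta)
        g = np.array([(log_joint(theta + h * e) - log_joint(theta - h * e)) / (2 * h)
                      for e in eye])
        mu += lr * g                            # chain rule: dtheta/dmu = 1
        rho += lr * (g * eps * np.exp(rho) + 1.0)  # + 1 from the Gaussian entropy
    return mu, np.exp(rho)                      # approximate posterior mean and std
```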

Byzantine Stochastic Gradient Descent

This paper studies the problem of distributed stochastic optimization in an adversarial setting where, out of the m machines which allegedly compute stochastic gradients every iteration, an α-fraction are Byzantine, and can behave arbitrarily and adversarially. Our main result is a variant of stochastic gradient descent (SGD) which finds ε-approximate minimizers of convex functions in T = Õ(1...
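
A common illustration of Byzantine-tolerant aggregation, simpler than the paper's actual variant, replaces the gradient mean with a coordinate-wise median, which a minority of arbitrary rows cannot drag far:

```python
import numpy as np

def byzantine_robust_step(w, worker_grads, lr=0.1):
    """One SGD step aggregating the m reported gradients with a
    coordinate-wise median instead of a mean. worker_grads: (m, d) array,
    an unknown minority of whose rows may be arbitrary (Byzantine).
    Illustrative; not the specific variant the paper analyzes."""
    agg = np.median(np.asarray(worker_grads), axis=0)
    return w - lr * agg
```

With an honest majority, each median coordinate lies within the range of the honest workers' values, whereas a single corrupted row can move the mean arbitrarily far.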

Parallelized Stochastic Gradient Descent

With the increase in available data, parallel machine learning has become an increasingly pressing problem. In this paper we present the first parallel stochastic gradient descent algorithm including a detailed analysis and experimental evidence. Unlike prior work on parallel optimization algorithms [5, 7] our variant comes with parallel acceleration guarantees and it poses n...
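
The pattern analyzed in this line of work can be pictured as parameter averaging: each machine runs plain SGD on its own data shard and the local solutions are averaged once at the end. Below is a sequential simulation of that pattern, with an illustrative gradient callback.

```python
import numpy as np

def parallel_sgd(shards, grad, w0, lr=0.01, epochs=5):
    """Parameter-averaging parallel SGD, simulated sequentially: run
    independent SGD on each shard, then average the local weight vectors.
    shards: list of lists of (x, y) examples; grad(w, x, y) -> gradient."""
    locals_ = []
    for shard in shards:                   # each iteration plays one machine
        w = np.asarray(w0, dtype=float).copy()
        for _ in range(epochs):
            for x, y in shard:
                w -= lr * grad(w, x, y)    # local stochastic gradient step
        locals_.append(w)
    return np.mean(locals_, axis=0)        # the only communication: one average

# Illustrative use with a least-squares gradient:
# grad = lambda w, x, y: 2.0 * (w @ x - y) * x
```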

Journal

Journal title: Lecture Notes in Computer Science

Year: 2023

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-031-22203-0_6